On Approximation Algorithm for Orthogonal Low-Rank Tensor Approximation

Authors

Abstract

This work studies solution methods for approximating a given tensor by a sum of R rank-1 tensors, with one or more of the latent factors being orthonormal. Such a problem arises in applications such as image processing, joint singular value decomposition, and independent component analysis. Most existing algorithms are of the iterative type, while algorithms of the approximation type are limited. By exploring the multilinearity and orthogonality of the problem, we introduce an approximation algorithm in this work. Depending on the computation of several key subproblems, the proposed algorithm can be either deterministic or randomized. The approximation lower bound is established, both in the deterministic and the expected senses. The approximation ratio depends on the size of the tensor, the number of rank-1 terms, and the data. When reduced to the rank-1 case, the ratio coincides with those in the literature. Moreover, the presented results fill a gap left in Yang (SIAM J Matrix Anal Appl 41:1797–1825, 2020), where the bound was established only when there is one orthonormal factor. Numerical results show the usefulness of the algorithm.
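
As a rough illustration of the setting (not the paper's algorithm), the following NumPy sketch builds a sum of R rank-1 terms whose mode-1 factor is orthonormal: it truncates an SVD of the mode-1 unfolding and then compresses each component slice to rank 1. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def orthogonal_rank_r_approx(T, R):
    """Sketch of an approximation-type (non-iterative) scheme with one
    orthonormal factor; illustrative only, not the algorithm in the paper."""
    n1, n2, n3 = T.shape
    T1 = T.reshape(n1, n2 * n3)                 # mode-1 unfolding
    U, s, Vt = np.linalg.svd(T1, full_matrices=False)
    terms = []
    for r in range(R):
        M = (s[r] * Vt[r]).reshape(n2, n3)      # matricized r-th component
        u2, s2, v2t = np.linalg.svd(M)          # best rank-1 of the slice
        terms.append((U[:, r], s2[0] * u2[:, 0], v2t[0]))
    # assemble the rank-R approximation as a sum of outer products
    approx = sum(np.einsum('i,j,k->ijk', a, b, c) for a, b, c in terms)
    return approx, np.stack([t[0] for t in terms], axis=1)

rng = np.random.default_rng(0)
T = rng.standard_normal((5, 4, 3))
approx, U = orthogonal_rank_r_approx(T, 2)
print(np.allclose(U.T @ U, np.eye(2)))          # True: mode-1 factor is orthonormal
```

Because the mode-1 components are mutually orthogonal, the squared error decomposes across terms, so the sketch always improves on the trivial zero approximation.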


Similar Resources

Low-rank Tensor Approximation

Approximating a tensor by another of lower rank is in general an ill-posed problem. Yet this kind of approximation is mandatory in the presence of measurement errors or noise. We show how tools recently developed in compressed sensing can be used to solve this problem. More precisely, a minimal angle between the columns of the loading matrices makes it possible to restore both the existence and uniqueness of the...


Relative Error Tensor Low Rank Approximation

We consider relative error low rank approximation of tensors with respect to the Frobenius norm. Namely, given an order-q tensor A ∈ R^{∏_{i=1}^q n_i}, output a rank-k tensor B for which ‖A − B‖_F ≤ (1 + ε) OPT, where OPT = inf_{rank-k A′} ‖A − A′‖_F. Despite much success on obtaining relative error low rank approximations for matrices, no such results were known for tensors. One structural issue is that ...
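
The matrix case this snippet contrasts with is straightforward: by the Eckart–Young theorem, the truncated SVD attains OPT exactly in the Frobenius norm, so any (1 + ε)-relative-error guarantee is trivially met. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
B = (U[:, :k] * s[:k]) @ Vt[:k]           # rank-k SVD truncation
opt = np.sqrt((s[k:] ** 2).sum())         # Eckart-Young: OPT = tail singular mass
print(np.isclose(np.linalg.norm(A - B), opt))  # True: truncation is optimal
```

For tensors no analogue of the SVD truncation is optimal, which is exactly the gap the quoted work addresses.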


Tensor Low Multilinear Rank Approximation by Structured Matrix Low-Rank Approximation

We present a new connection between higher-order tensors and affinely structured matrices, in the context of low-rank approximation. In particular, we show that the tensor low multilinear rank approximation problem can be reformulated as a structured matrix low-rank approximation, the latter being an extensively studied and well-understood problem. We first consider symmetric tensors. Although t...


On the Tensor SVD and the Optimal Low Rank Orthogonal Approximation of Tensors

It is known that a higher order tensor does not necessarily have an optimal low rank approximation, and that a tensor might not be orthogonally decomposable (i.e., admit a tensor SVD). We provide several sufficient conditions which lead to the failure of the tensor SVD, and characterize the existence of the tensor SVD with respect to the Higher Order SVD (HOSVD). In face of these difficulties t...


Approximation Algorithms for l0-Low Rank Approximation

For any column A_{:,i} the best response vector is 1, so ‖A_{:,i} 1^T − A‖_0 = 2(n − 1) = 2(1 − 1/n) OPT_{F1}, where OPT_{F1} = n (Boolean l0-rank-1). Theorem 3 (Sublinear). Given A ∈ {0,1}^{m×n} with column adjacency arrays and with row and column sums, we can compute w.h.p. in time O(min{‖A‖_0 + m + n, ψ_B^{−1} (m + n) log(mn)}) vectors u, v such that ‖A − uv^T‖_0 ≤ (1 + O(ψ_B)) OPT_B. Theorem 4 (Exact). Given A ∈ {0,1}^{m×n} with OPT_B / ‖A‖_0...



Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2022

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-022-02050-x